Search results: All records, Creators/Authors contains: "Coffman, Austin R."


  1. We consider the problem of optimal control of district cooling energy plants (DCEPs) consisting of multiple chillers, a cooling tower, and a thermal energy storage (TES) system, in the presence of time-varying electricity prices. A straightforward application of model predictive control (MPC) requires solving a challenging mixed-integer nonlinear program (MINLP) because of the on/off switching of the chillers and the complexity of the DCEP model. Reinforcement learning (RL) is an attractive alternative, since its real-time control computation is much simpler, but designing an RL controller is challenging due to the myriad design choices and computationally intensive training. In this paper, we propose an RL controller and an MPC controller for minimizing the electricity cost of a DCEP and compare them via simulations. The two controllers are designed to be comparable in terms of objectives and information requirements. The RL controller uses a novel Q-learning algorithm based on least-squares policy iteration. We describe the design choices for the RL controller, including the choices of state space and basis functions, that were found to be effective. The proposed MPC controller does not need a mixed-integer solver for implementation, only a nonlinear program (NLP) solver. A rule-based baseline controller is also proposed to aid the comparison. Simulation results show that the proposed RL and MPC controllers achieve similar savings over the baseline controller, about 17%.
    Free, publicly-accessible full text available November 7, 2024
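     The least-squares policy iteration (LSPI) flavor of Q-learning mentioned in the abstract can be sketched on a toy problem. Everything below (the two-state MDP, the one-hot basis, the transition and reward tables, the sample count) is hypothetical and merely stands in for the DCEP state space and basis functions that the paper itself designs:

     ```python
     import numpy as np

     # Minimal LSPI sketch on a toy 2-state, 2-action MDP (illustrative only;
     # the paper's DCEP model, state space, and basis functions differ).
     n_states, n_actions, gamma = 2, 2, 0.9

     def phi(s, a):
         """One-hot basis over (state, action) pairs: a stand-in for the
         problem-specific basis functions described in the paper."""
         v = np.zeros(n_states * n_actions)
         v[s * n_actions + a] = 1.0
         return v

     # Hypothetical transition probabilities P[s][a][s'] and rewards R[s][a].
     P = np.array([[[0.9, 0.1], [0.2, 0.8]],
                   [[0.7, 0.3], [0.1, 0.9]]])
     R = np.array([[1.0, 0.0], [0.0, 2.0]])

     rng = np.random.default_rng(0)
     # Collect a batch of transition samples (s, a, r, s').
     samples = []
     for _ in range(5000):
         s = rng.integers(n_states)
         a = rng.integers(n_actions)
         s_next = rng.choice(n_states, p=P[s][a])
         samples.append((s, a, R[s][a], s_next))

     # LSPI: alternate LSTD-Q least-squares weight fits with greedy
     # policy improvement, using the same fixed batch of samples.
     w = np.zeros(n_states * n_actions)
     for _ in range(20):
         policy = lambda s: int(np.argmax([phi(s, a) @ w for a in range(n_actions)]))
         A = np.zeros((len(w), len(w)))
         b = np.zeros(len(w))
         for (s, a, r, s_next) in samples:
             f = phi(s, a)
             A += np.outer(f, f - gamma * phi(s_next, policy(s_next)))
             b += r * f
         w = np.linalg.solve(A + 1e-6 * np.eye(len(w)), b)

     greedy = [int(np.argmax([phi(s, a) @ w for a in range(n_actions)]))
               for s in range(n_states)]
     print(greedy)  # greedy policy recovered from the fitted Q-weights
     ```

     The appeal noted in the abstract is visible here: once the weights are fitted offline, real-time control is just an argmax over the basis features, with no optimization problem to solve online.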
  2. Flexible loads are a resource for the Balancing Authority (BA) of the future to aid in balancing power supply and demand. To be used as a resource, the BA must know the capacity of the flexible loads to vary their power demand around a baseline without violating consumers' quality of service (QoS). Existing work on capacity characterization is model-based: it requires models relating power consumption to the variables that dictate QoS, such as temperature in the case of an air conditioning system. In many cases, however, the model parameters are unknown or difficult to obtain. In this work, we propose a data-driven capacity characterization method that does not require model information; it only needs access to a simulator. The capacity is characterized as the set of feasible spectral densities (SDs) of the demand deviation. The proposed method extends our recent work on SD-based capacity characterization, which was limited to the case where the loads' dynamic model is completely known. A numerical evaluation of the method is provided, comparing our approach to the model-based solution of our past work.
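     The frequency-domain idea behind the abstract, that a demand deviation is "in capacity" when it keeps QoS within bounds, and that this depends on the deviation's frequency content, can be illustrated with a simulator-only test. The first-order thermal load, QoS band, and all numbers below are hypothetical; the paper's actual characterization of the full set of feasible spectral densities is more general than this single-frequency probe:

     ```python
     import numpy as np

     # Toy check of frequency-dependent flexibility capacity. The "capacity
     # test" only calls a black-box simulator, mirroring the data-driven
     # setting: no model parameters are used by the test itself.
     dt, T = 1.0, 4000        # time step [s] and simulation horizon [steps]
     a, b = 0.999, 0.01       # hypothetical load dynamics: x+ = a*x + b*u
     band = 0.5               # allowed temperature deviation (QoS band)

     def simulate(u):
         """Black-box simulator: temperature deviation driven by power deviation u."""
         x, traj = 0.0, []
         for uk in u:
             x = a * x + b * uk
             traj.append(x)
         return np.array(traj)

     def feasible(freq_hz, amplitude):
         """A sinusoidal demand deviation is 'in capacity' if QoS holds."""
         t = np.arange(T) * dt
         u = amplitude * np.sin(2 * np.pi * freq_hz * t)
         return bool(np.max(np.abs(simulate(u))) <= band)

     # Slow deviations integrate into large temperature excursions and
     # violate QoS; fast deviations of the same amplitude average out.
     print(feasible(1e-4, 1.0), feasible(5e-2, 1.0))  # False True
     ```

     Sweeping such probes over frequency and amplitude traces out a frequency-dependent capacity boundary, which is the intuition behind characterizing capacity as a set of feasible spectral densities.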